From: route@monster.com

Sent: Wednesday, October 21, 2015 10:12 AM

To: hg@apeironinc.com

Subject: Please review this candidate for: IBM IID Architect

 

This resume has been forwarded to you at the request of Monster User xapeix03

Thulasi M 

Last updated:  05/28/15

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


Greenwood, IN  46143
US


 

 

RESUME

  

Resume Headline: Thulasi_Informatica

Resume Value: 6a9pyyywb3kmvix7   

  

 

Thulasi Muthyala                                                                                                  

Sr. Informatica Developer      thulasi.muthyala@gmail.com                                                             

  

 

Personal Summary:

·   Over 8 years of IT experience in the design, development, implementation, and maintenance of data warehouse applications as a Developer using Informatica Power Center 9.x/8.x/7.x/6.x/5.x.

·   Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.

·   Strong expertise in designing and developing Business Intelligence solutions in staging, populating Operational Data Stores (ODS), Enterprise Data Warehouses (EDW), and Data Marts / Decision Support Systems using the Informatica Power Center 9.x/8.x/7.x ETL tool.

·   Experience in Extraction, Transformation and Loading of data from different heterogeneous source systems like Flat files (Fixed width & Delimited), XML Files, COBOL files, VSAM, IBM DB2 UDB, Excel, Oracle, Sybase, MS SQL Server and Teradata.

·   Good understanding of Star and Snowflake Schema, Dimensional Modeling, Relational Data Modeling, Slowly Changing Dimensions and data warehousing concepts.

·   Experience in using Teradata load utilities (FASTLOAD, MULTILOAD and TPUMP) to load huge volumes of data to Teradata RDBMS.

·   Experience in working with UNIX Shell scripts and Batch scripts and necessary Test Plans to ensure the successful execution of the data loading process.

·   Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schemas.

·   Hands on experience in all aspects of Software Development Life Cycle (SDLC).

·   Exceptional problem-solving and sound decision-making capabilities, recognized by associates for data quality, alternative solutions, and confident, accurate decision making.

·   Extensive Data Warehouse experience using Informatica Power Center / Power Mart (Source Analyzer, Repository Manager, Server Manager, Mapping Designer, Mapplet Designer, Transformation Designer) as ETL tool on Oracle Database.

·   Strong Experience on Workflow Manager Tools - Task Developer, Worklet & Workflow Designer.

·   Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like targets, sources, mappings and sessions.

·   Involved in all aspects of ETL- requirement gathering, coming up with standard interfaces to be used by operational sources, data cleaning, coming up with data load strategies, designing various mappings, developing mappings, unit testing, integration testing, regression testing and UAT in development.

·   Good knowledge in interacting with Informatica Data Explorer (IDE), and Informatica Data Quality (IDQ)

·   Strong experience in coding using SQL, PL/SQL Procedures/Functions, Triggers and Packages.

·   Well versed in developing complex queries, unions, multi-table joins, views, and sub-queries.

·   Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications.

·   Effective team member with the ability to perform individually; good interpersonal relations, strong communication skills, hardworking, and highly motivated.

 

 

Technical Skills:

 

 

ETL Tools

Informatica Power Center 9.x/8.x/7.x/6.x (Informatica Designer, Mapping Designer, Mapplet, Transformations, Workflow Manager, Workflow Monitor), Informatica Power Exchange, OLTP, OLAP.

Database

Oracle 11g/10g/9i, IBM DB2/UDB 9.0, Teradata V2R12, MS SQL Server 2008/2005/2000, MySQL 4.5, Siebel Analytics, MS Access

Data Modeling

Erwin 7.2/4.1.4, Microsoft Visio, PowerDesigner 12.5

Operating Systems

UNIX (AIX, HP-UX), Linux, MS-DOS, Windows 9x/NT/2K/XP

Languages

SQL, PL/SQL, Unix Shell scripting, UML, C, C++

Web Technology

HTML, Java Script, XML

Database Tools 

SQL*Plus, SQL*Loader, QueryMan, Toad

Other Tools

Autosys

 

 

Professional Experience:

 

State of Indiana -Indianapolis, IN                     (Jan ‘13 – Current)

Role: Sr. Informatica (ETL) Developer

 

Worked with the Indiana Family and Social Services Administration (FSSA) to provide application management support for the FSSA Division of Family Resources (DFR) and the Welfare Reform Integrated Database (WRIDB), also known as the Social Services Data Warehouse.

 

The data warehouse receives data files in various formats, including text, Excel, mainframe physical sequential files, Access, and SQL Server 2005 tables and views. Informatica jobs typically load these files into staging tables. Specific sets of data are extracted from the staging tables, transformed based on established business rules, requirements, and DWH standards, and then loaded into the core tables of the data warehouse. Data from the core tables is summarized and loaded into aggregate tables.

 

Responsibilities:

 

·   Participated in daily/weekly meetings, monitored the work progresses of teams and proposed ETL strategies.

·   Used Informatica data services to profile and document the structure and quality of all data

·   Based on requirements, designed and coded new Mappings, validated and debugged existing Mappings, tested Workflows and Sessions, and determined the best technical solutions for Source/Target compatibility across old and new Mappings. Identified bottlenecks in existing Mappings and tuned them for better performance.

·   Migrated code between the Dev, Test, and Prod environments and wrote Team Based Development technical documents for the smooth transfer of the project. Prepared ETL technical Mapping documents, along with test cases for each Mapping, for future development to maintain the SDLC. Developed mappings to load into staging tables and then into Dimensions and Facts.

·   Used Mapping & Session Variables/Parameters, Parameter files, Reusable Transformations & Mapplets to maintain the development life cycle, and fixed others' Mappings.

·   For each Mapping, prepared effective Unit, Integration, and System test cases at various stages to capture data discrepancies/inaccuracies and ensure the successful loading of accurate data.

·   Worked with Informatica PowerCenter 9.5.1HF4 Tools- Repository Manager, Informatica Designer, Workflow Manager/Monitor and carefully monitored the system during data loading.

·   Designed the automation process of Sessions, Workflows, scheduled the Workflows, created Worklets (command, email, assignment, control, event wait/raise, conditional flows, etc.) and configured them according to business logics & requirements to load data from different Sources to Target.

·   Created Pre- & Post-Session UNIX scripts to merge flat files, create and delete temporary files, rename files to reflect the generation date, etc.

·   Used Debugger to validate the Mappings and gained troubleshooting information about the data and error conditions. Involved in fixing the invalid Mappings. Wrote various Functions, Triggers and Stored Procedures to drop, re-create the indexes and to solve the complex calculations.
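The pre/post-session file handling described above (merging flat files, deleting temporaries, stamping the output with the generation date) can be sketched as follows. This is an illustrative Python sketch only — the actual scripts were UNIX shell, and the file names and layout here are assumptions:

```python
import os
import shutil
from datetime import date

def merge_flat_files(input_paths, merged_dir, base_name="daily_extract"):
    """Merge several flat files into one output file whose name carries
    the generation date, then delete the temporary input files.
    (Hypothetical names; sketch of the pre/post-session script logic.)"""
    os.makedirs(merged_dir, exist_ok=True)
    # e.g. daily_extract_20151021.dat -- file name reflects the generated date
    stamped = f"{base_name}_{date.today():%Y%m%d}.dat"
    target = os.path.join(merged_dir, stamped)
    with open(target, "wb") as out:
        for path in input_paths:
            with open(path, "rb") as src:
                shutil.copyfileobj(src, out)  # append each flat file to the merge
            os.remove(path)                   # clean up the temporary file
    return target
```

In the actual workflow this logic would run as a Post-Session command task, so the session's output files are merged and renamed as soon as the load completes.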

 

Environment:  Informatica Power Center 9.5, Information Lifecycle Management (ILM) 6.2, Power Exchange 8.x, Oracle 11g, XML files, SQL Server 2008, Oracle Express, SQL Developer, Flat Files, Autosys, UNIX, Cognos, Business Objects XI reports.

 

Pacific Gas & Electric - San Ramon, CA                 (Jul ‘12 – Dec ‘12)

Role: Sr. Informatica Developer

 

Description: Pacific Gas and Electric Company, incorporated in California in 1905, is one of the largest combination natural gas and electric utilities in the United States. Based in San Francisco, the company is a subsidiary of PG&E Corporation. The company provides natural gas and electric service to approximately 16 million people throughout a 70,000-square-mile service area in northern and central California.

 

Working on a project named MDS and Endur Retirement. These applications are being retired from the old Oracle database and moved to a new Teradata database for more efficient use of the data for reporting and audit purposes.

 

Responsibilities:

·   Worked closely with business analyst and Data Warehouse architect to understand the source data and the need of the Warehouse.

·   Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.

·   Prepared various design documents, including technical design documents, mapping specification documents, and target-to-source mapping documents.

·   Developed mappings using Informatica to load data from sources such as Oracle and SQL Server into Teradata.

·   Extensively worked with transformations like Lookup, Update Strategy, Expression, Filter, Router, Joiner, Aggregator, and Source Qualifier.

·   Loaded data to the staging area using the Teradata parallel transporter.

·   Created Indexes, primary keys and checked other performance tuning at database level.

·   Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.

·   Involved in enhancing the existing Informatica code and bug fixing.

·   Developed and fine-tuned Teradata Stored Procedures.

·   Implemented complex mappings such as Slowly Changing Dimensions (Type II).

·   Extensively worked with Lookup Caches like Persistent Cache, Static Cache, and Dynamic Cache to improve the performance of the lookup transformations.

·   Worked with reusable objects like Reusable Transformation and Mapplets

·   Extensively worked with Incremental Loading using Parameter Files, Mapping Variables and Mapping Parameters

·   Handled various loads like Intra Day Loads, Daily Loads, Weekly Loads, Monthly Loads, and Quarterly Loads using Incremental Loading Technique

·   Responsible for error handling using Session Logs and Reject Files in the Workflow Monitor
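The Type II slowly changing dimension mappings mentioned above follow a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new current row is inserted, preserving history. A minimal Python sketch of that logic (illustrative only — the actual work used Informatica mappings with Lookup and Update Strategy transformations; column names like cust_id and address are assumptions):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Type II SCD: expire the current row on change and insert a new
    current row. Rows are dicts; end_date of None marks the current row.
    (Simplified sketch of the mapping logic; column names are hypothetical.)"""
    today = today or date.today()
    # index the current (open-ended) row for each business key
    current = {row["cust_id"]: row for row in dimension if row["end_date"] is None}
    for rec in incoming:
        live = current.get(rec["cust_id"])
        if live is None:
            # brand-new business key: insert as the current row
            dimension.append({**rec, "start_date": today, "end_date": None})
        elif live["address"] != rec["address"]:
            # tracked attribute changed: close out the old row, add a new version
            live["end_date"] = today
            dimension.append({**rec, "start_date": today, "end_date": None})
        # unchanged records leave the dimension untouched
    return dimension
```

The same decision (insert vs. expire-and-insert vs. ignore) is what the Lookup/Update Strategy pair computes row by row inside a PowerCenter Type II mapping.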

 

Environment: Informatica Power Center 9.1 (Power Center Designer, Workflow Manager, Workflow Monitor), Oracle 11g, Teradata, SQL Server 2008, TOAD, SQL Developer, Teradata SQL Assistant, Autosys.

 

TIAA-CREF - Charlotte, NC                                                                                (Oct ‘11 – Jun‘12)

Role: Sr. Informatica Developer

 

TIAA-CREF is a non-profit financial organization that for over 90 years has served the greater good by helping those in the academic, medical, cultural, and research fields plan for and live in retirement. The project covers data loading of GL balances and inflow/outflow transactions of participant GL accounts, with the ETL process built according to the Business Requirement Documents, from corresponding sources such as the EDW, FDW, DRM, and manual data files into the respective tables and datasets.

 

Responsibilities:

 

·   Working on different tasks in Workflows like sessions, events raise, event wait, e-mail, command, worklets and scheduling of the workflow.

·   Creating sessions, configuring workflows to extract data from various sources, transforming data, and loading into an enterprise data warehouse.

·   Running and monitoring daily scheduled jobs using Workload Manager, supporting EDW (Enterprise Data Warehouse) loads for history as well as incremental data.

·   Design, Development and Documentation of the ETL (Extract, Transformation & Load) strategy to populate the Data Warehouse from the various source systems.

·   Prepared data marts on policy data, policy coverage, claims data, client data and risk codes.

·   Extensively used Informatica PowerCenter 8.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.

·   Involved in the design and development of complex ETL mappings and stored procedures in an optimized manner. Used Power Exchange for mainframe sources.

·   Involved in loading data from Source Tables to the ODS (Operational Data Store) and XML files, using transformation and cleansing logic in Informatica.

·   Based on the logic, used various transformations like Source Qualifier, Normalizer, Expression, Filter, Router, Update strategy, Sorter, Lookup, Aggregator, Joiner, XML, Stored procedure transformations in the mapping.

·   Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.

·   Developed Informatica SCD Type I, Type II, and Type III mappings and tuned them for better performance. Extensively used almost all Informatica transformations, including complex Lookups, Stored Procedures, Update Strategy, Mapplets, and others.

·   Snowflake Schema was mainly used with Geography, Customer, Product, and Time as basic dimensions.

·   Creating test cases for Unit, System, Integration, and UAT testing to check data quality.

·   Investigating failed jobs and writing SQL to debug data load issues in Production.

·   Writing SQL Scripts to extract the data from Database and for Testing Purposes.

·   Interacting with the Source Team and Business to get the Validation of the data.

·   Supported the code after post production deployment.
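The incremental loads mentioned above rest on a high-water mark: the job persists the timestamp of the last row it loaded and, on the next run, picks up only newer rows — the role a mapping variable such as $$LastRunTime and a parameter file play in a PowerCenter session. A minimal Python sketch (illustrative only; the state-file name and row shape are assumptions):

```python
import json
import os

def load_incremental(rows, state_file="last_run.json"):
    """Return only rows stamped after the persisted high-water mark,
    then advance the mark -- analogous to a $$LastRunTime mapping
    variable persisted via a parameter file. (Hypothetical names.)"""
    last = 0
    if os.path.exists(state_file):
        with open(state_file) as f:
            last = json.load(f)["last_ts"]
    new_rows = [r for r in rows if r["ts"] > last]
    if new_rows:
        # persist the new high-water mark for the next run
        with open(state_file, "w") as f:
            json.dump({"last_ts": max(r["ts"] for r in new_rows)}, f)
    return new_rows
```

Run twice against a growing source, the second call returns only the rows added since the first — which is what keeps daily, weekly, and monthly loads from re-processing history.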

 

 

Environment: Informatica Power Center 8.6.1/8.1.1, Oracle Business Intelligence (OBIEE), Oracle 10g/9i, MS SQL Server 2008, TOAD for SQL Server, Flat Files, PL/SQL, Windows 2000, XML.

 

Customer Portfolios- Boston, MA     (Apr’10 – Sep ‘11)  

 

Role: Informatica Developer

Customer Portfolios delivers coordinated, multi-channel marketing that is targeted, measurable, and timed to the precise moment for the individual customer. It provides marketing services and technology to help clients execute Lights-Out Marketing programs. Through the utilization of customer insight and its business platform, Customer Portfolios is able to create intelligent Lights-Out Marketing programs. It is a privately held company with 50 employees.

 

Responsibilities:

         Interacted with Business Analysts, SAP developers, Clarify (customer service team) developers, and SAP Functional Consultants to identify the business requirements and data realities.

         Worked with Functional consultant and SAP developers to design the function modules to extract the data from SAP R3 systems.

         Used Pentaho Data Integration Designer to create ETL transformations

         Extensively used Informatica to load source data from file systems to the ODS and to SAP BW, R3 & POSDM using Informatica Power Connect; created reusable transformations (Joiner, Router, Lookup, Rank, Filter, Expression, and Aggregator) inside a mapplet; and created new mappings using the Designer module of Informatica Power Center to implement the business logic and load the customer healthcare data incrementally.

         Created Complex mappings using Unconnected Lookup, and Aggregate and Router transformations for populating target table in an efficient manner.

         Used Shortcuts (Global/Local) to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.

         Used Type 1 SCD and Type 2 SCD mappings to update Slowly Changing Dimension tables.

         Worked extensively with SAP developers to create function modules or transparent tables (Z tables in SAP) for the migration/loading of data from SAP R3 and SAP BW to Oracle, and then loaded to external targets like SQL Server and flat files.

         Used BCI to connect and upload data into SAP through Informatica.

         Worked on SQL tools like PL/SQL Developer, SQL*Plus, and MS SQL Visual Studio to run SQL queries and validate the data.

         Designed SSIS Packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.

         Worked along with the SAP Functional team on the SAP Solution Manager and Service Manager tools for transferring or transporting programs from one level to another in SAP (landscape levels such as Development, UAT, Regression testing, and Production)

         Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of the repository and folder.

         Worked extensively on performance tuning of Informatica PowerCenter Mappings, as well as tuning of the sessions by creating multiple sessions on primary key values using a partition query.

         Optimized the mappings using various optimization techniques and debugged some existing mappings using the Debugger to test and fix them

         Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.

         Involved in designing, developing and deploying reports in MS SQL Server environment using SSRS 2008 and SSIS in Business Intelligence Development Studio (BIDS)

         Designed and developed a series of complex Business Intelligence solutions using Pentaho Report Designer

         Created various tasks like Session, Command, Timer and Event wait.

         Migrated mappings, sessions, and workflows from Development to Testing and then to Production environments.

         Performed unit, integration and system level performance testing. Associated with the production support team in various performances related issues. Created testing documents for the migration of SAP programs to the next level for the approval of the SAP functional team.

         Prepared the complete data mapping for all the migrated jobs using SSIS.

         Provided production support by monitoring the processes running daily.

         Scheduled the Jobs by using Informatica scheduler.

 

 

Environment: Informatica PowerCenter 7.1, Informatica PowerConnect for SAP R3, SAP CRM, Oracle 10g, Erwin, Windows Server 2005, Flat Files, XML Files, Business Objects, UNIX, Perl Scripts, MS Access.

 

 

Hewlett Packard- Bangalore, India       (Feb ’07 – Mar ‘10)  

Role: Informatica Developer

The scope of the Project is to provide the regional trade a tool by which they are able to analyze their promotions and track the success of each promotion that goes live.

 

It is also a tool for management to assess the trade, identify the successful promotions, and provide assistance in creating similar promotions with a proven track record of success.

 

It will assist the Forecast Planners in planning the production of items to meet the forecasted demand. The Transactional system has been updated to allow the trade to create trade promotions according to P&G requirements with forecasts for both volume and funds. The TPM project will use this information, including actual shipment data and consumption data from other sources to monitor the promotions.

 

Responsibilities:

 

·   Extensively used Informatica to extract data from sources such as OLTP systems (i.e. Siebel OLTP System) and finally load in the Data Warehouse to support analysis and reporting.

·   Worked on Informatica Power Center tools - Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager and Informatica Server Manager.

·   Extensively used PL/SQL Procedures/Functions to build business rules.

·   The Informatica Metadata repository was created using the Repository Manager as a hub for interaction between the various tools.

·   Designed and Developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (Fact/Summary) data identified by dimensions using Informatica Power Center tools.

·   Created the mappings using Transformations such as the Source qualifier, Aggregator, Expression, Lookup, Router, Filter, Normalizer and Update Strategy.

·   Extensively involved in data modeling and Production Support.

 

Environment: Informatica 6.2, Oracle 9i, OLTP, PL/SQL, SQL Server, DB2, MS Excel, Siebel Analytics.

 

 

 

Educational Qualification:

         Bachelor of Technology in Computer Science from JNTU



Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Work Status:

US - I am authorized to work in this country for any employer.

 

 


 

Target Locations:

Selected Locations:

US-IN-Indianapolis